How to Track and Debug Job Queue Failures in Business Central for a Cameroon-Based Consulting Company
Summary This blog explains how to effectively track and debug job queue failures in Microsoft Dynamics 365 Business Central. In many Business Central implementations, job queues are used to automate critical background processes such as posting transactions, sending emails, synchronizing data, and running reports. However, when these jobs fail, identifying the root cause can become challenging due to limited visibility and lack of proper debugging practices. This blog provides a structured approach to monitor job queue failures, analyze error logs, and debug issues efficiently using built-in tools and development techniques. This blog explains: 1] Common reasons behind job queue failures 2] How to track failed job queue entries 3] How to debug job queue errors using AL 4] Best practices for logging and monitoring 5] Business impact of efficient job queue handling Table of Contents Customer Scenario A growing organization using Microsoft Dynamics 365 Business Central had automated multiple backend processes using Job Queues. These included: 1] Automatic posting of invoices 2] Scheduled report generation 3] Email notifications to customers 4] Data synchronization with external systems While automation improved efficiency, the team started facing frequent job queue failures. The challenges included: 1] No clear visibility of why jobs were failing 2] Errors appearing without sufficient details 3] Delays in critical processes like posting and integrations 4] Manual intervention required to restart failed jobs 5] Increased dependency on technical teams Since job queues run in the background, users were often unaware of failures until business operations were impacted. The organization needed a structured way to track, analyze, and debug job queue failures efficiently. Solution Overview To address these challenges, a systematic approach was implemented to monitor and debug job queue failures within Business Central. 
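In spirit, the monitoring side of that approach boils down to: list the job queue entries, filter on Error status, and read each entry's last error message. The following Python sketch is only a toy model of that triage loop; the records and field names are illustrative stand-ins, not the actual Business Central Job Queue Entry table:

```python
# Toy model of job queue monitoring: scan entries, surface failures
# together with their last error message so they can be triaged.
# Entry data and field names are illustrative, not real BC records.
entries = [
    {"id": "JQ-001", "status": "Ready",      "last_error": ""},
    {"id": "JQ-002", "status": "Error",      "last_error": "Posting date is not within allowed range"},
    {"id": "JQ-003", "status": "In Process", "last_error": ""},
]

def failed_entries(job_queue):
    """Return (id, last error message) for every entry in Error status."""
    return [(e["id"], e["last_error"]) for e in job_queue if e["status"] == "Error"]

failures = failed_entries(entries)
```

In Business Central itself the same filter is applied on the Job Queue Entries page (Status = Error), with the error text surfaced via the Show Error action described below.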
The goal was simple: Enable quick identification and resolution of job queue failures with minimal effort. With this approach: The workflow now looks like this: Functional Implementation Approach The implementation focuses on improving visibility, debugging capability, and system reliability. Monitoring Job Queue Entries Business Central provides a dedicated Job Queue Entries page where all scheduled jobs are listed. Key fields to monitor: 1] Status (Ready, In Process, Error) 2] Earliest Start Date/Time 3] Recurrence settings 4] Object Type and Object ID 5] Last Error Message When a job fails, the status changes to Error, which becomes the primary trigger for investigation. Analyzing Job Queue Log Entries Each job queue execution creates log entries that store execution details. These logs provide: 1] Error messages 2] Execution time 3] Call stack (in some cases) 4] Number of attempts This is the first place to check when debugging a failure. Using “Show Error” Functionality The “Show Error” action provides detailed error messages generated during execution. This helps identify: 1] Missing data 2] Invalid field values 3] Permission issues 4] Posting errors Debugging Job Queue Failures Debugging job queues requires a slightly different approach compared to normal execution. Attaching Debugger to Session Since job queues execute in the background, debugging requires attaching to the active session where the job is running. Steps: 1] Go to Help and Support page 2] Click on Attach Debugger to this Session 3] Set breakpoints in the relevant codeunit 4] Trigger or wait for the job queue to execute This method allows you to debug the exact session where the job queue is running, making it easier to trace issues in real time. Using Breakpoints in Codeunits Most job queues run codeunits. 
Developers should: 1] Identify the Codeunit ID from Job Queue Entry 2] Add breakpoints in key logic areas 3] Re-run the job queue 4] Step through execution Common Causes of Failures Some frequent reasons include: 1] Missing mandatory fields 2] Incorrect filters in code 3] Permission issues for background user 4] Deadlocks or record locking 5] Integration/API failures Handling Complex Scenarios In real-world implementations, job queue failures can involve complex scenarios. Logging Custom Errors Developers can enhance debugging by adding custom logs in AL code. For example: 1] Logging key variable values 2] Capturing intermediate processing steps 3] Writing meaningful error messages This makes troubleshooting faster and more accurate. Retry Mechanism Job queues support automatic retries. Proper configuration ensures: 1] Temporary issues are resolved automatically 2] Manual intervention is minimized 3] System resilience improves Handling Integration Failures When job queues interact with external systems: 1] API timeouts must be handled 2] Response validation should be implemented 3] Retry logic should be added Business Impact 1] Reduced Downtime Quick identification of job queue failures ensures minimal disruption to business operations. 2] Improved System Reliability With proper monitoring and debugging, automated processes become more stable. 3] Increased Developer Productivity Developers spend less time identifying issues and more time resolving them. 4] Faster Issue Resolution Clear logs and debugging techniques reduce troubleshooting time significantly. 5] Scalable Automation Organizations can confidently automate more processes without fear of silent failures. Preview Video The preview video demonstrates how to track and debug job queue failures in Business Central. 
Video highlights: 1] Opening the Job Queue Entries page 2] Identifying failed jobs 3] Viewing Job Queue Log Entries 4] Using “Show Error” functionality 5] Attaching the debugger from Visual Studio Code Demo: Debugging a Job Queue Failure in Business Central Final Thoughts Job queues are a powerful feature in Microsoft Dynamics 365 Business Central that enables automation of critical business processes. However, without proper monitoring and debugging practices, failures can go unnoticed and impact operations. By implementing structured tracking, logging, and debugging techniques, organizations can transform job queue management from a reactive process into a proactive one. What was once a difficult and time-consuming troubleshooting activity can now be handled efficiently with the right approach—ensuring smooth and reliable system performance. If your Business Central environment is facing recurring job queue failures or requires optimization of background processes, consider implementing structured debugging and monitoring practices to improve overall system efficiency. Connect with CloudFronts to get started at transform@cloudfonts.com.
Share Story :
From Default to Dynamic: Transforming Dynamics CRM Subgrids with Custom HTML for a Netherlands-Based Sustainability Certification Non-Profit
Summary A Netherlands-based sustainability certification non-profit faced a key limitation in Dynamics CRM: the default subgrid had no way to filter related lookup values — meaning all versions and levels appeared for every certification standard, regardless of relevance. CloudFronts replaced the default subgrid with a custom HTML Web Resource that renders each certification standard as an interactive card, with its own pre-filtered version and level dropdowns. Users selecting C2C Certified® Full Scope now only see versions and levels that belong to Full Scope — not a cluttered list of every record in the related table. Beyond fixing the filtering gap, the solution transformed the CRM form experience from a flat, generic grid into a clean, modern card-based interface — significantly improving usability for both applicants and assessors. Table of Contents 1. Customer Scenario 2. The Real Problem — Unfiltered Lookups in Subgrids 3. Solution Overview 4. Key Features of the Custom Web Resource 5. How It Works — Technical Implementation 6. End-to-End Walkthrough 7. Architecture & Design Decisions 8. Business Impact 9. FAQs 10. Conclusion Customer Scenario A Netherlands-based non-profit organization uses Dynamics CRM to manage the full certification application lifecycle, from initial scoping through assessment and final issuance. As part of every certification application, users must define a Certification Scope, selecting which Cradle to Cradle standards they want assessed, choosing the correct version of that standard, and setting a target certification level. The available standards include: Full Scope Material Health Circularity Each standard has its own set of applicable versions and certification levels stored in related Dataverse tables. The challenge was making sure users could select and configure each standard correctly — without the CRM form showing them irrelevant data from other standards. 
The Real Problem — Unfiltered Lookups in Subgrids The original design used a default CRM subgrid to list the certification scope lines. Each row in the subgrid had lookup fields pointing to related Dataverse tables: one for Standard Version, one for Certification Level. The problem was straightforward but significant: CRM subgrid lookup fields have no native mechanism to filter their values based on another field in the same row. This meant that when a user opened the Version lookup on a Full Scope row, they saw every version across every standard — Full Scope versions, Material Health versions, Circularity versions, all mixed together in a single unfiltered list. The same issue applied to Certification Levels. There was no built-in way to say: “this row is for Full Scope — only show me Full Scope versions.” Key Pain Points Wrong version selected by mistake: With all versions in one unfiltered list, users had to manually identify and pick the correct one — easy to get wrong, especially for new staff unfamiliar with which version belongs to which standard. Cluttered, confusing lookup lists: A lookup showing 20+ mixed records when the user only needs to choose from 3–4 relevant options is a frustrating experience and a source of data quality issues. No visual structure or grouping: The default subgrid renders as a flat table. There is no way to visually distinguish one standard from another or understand the overall scope at a glance. A form that did not match the product’s quality: The client wanted their CRM environment to feel professional and polished — a plain, out-of-the-box grid did not meet that expectation. The subgrid was not broken — it was simply the wrong tool for this job. What was needed was a control that understood the relationship between a standard and its versions, and filtered accordingly. Solution Overview CloudFronts replaced the default subgrid entirely with a custom HTML Web Resource embedded directly on the CRM application form.
The web resource reads a single JSON field on the record which carries each standard along with its own pre-scoped list of versions and levels. The core idea: Each certification standard gets its own card → Each card shows only the versions and levels that belong to that standard → No more mixed, unfiltered lookup lists What This Achieves For Applicants and Assessors: Select one or more certification standards using clear, visual checkbox cards See only the versions relevant to the selected standard — nothing from other standards Choose a target level from a correctly filtered, correctly sorted dropdown Get instant visual feedback on which standards are active in the scope For the CRM Platform Team: No subgrid lookup filtering workarounds, form-level JavaScript hacks, or plugin-based filtering required All filtering is naturally handled by the JSON data structure, each standard row already carries only its own versions and levels A significantly better-looking form that reflects the quality of the certification program itself Scope configuration can be extended simply by updating the JSON, no schema changes needed Key Features of the Custom Web Resource The web resource was designed with two clear goals: solve the filtering problem correctly, and make the form experience noticeably better. Here is how each feature serves those goals. 1. Card-Per-Standard Layout with Checkbox Selection Each certification standard is rendered as its own self-contained card — with a title, a short description of what that standard covers, and a checkbox. This immediately solves the visual grouping problem that a flat subgrid cannot address. 
Clicking a card (or its checkbox) marks that standard as In Scope The selected card highlights with a blue left-border accent and a soft background tint — making it immediately clear which standards are active Deselected cards remain compact and unobtrusive, keeping the form clean For assessors reviewing multiple applications, being able to scan the full scope at a glance — without opening related records or reading through a grid — is a meaningful time saving. 2. Filtered Version Dropdown — Per Standard This is the feature that directly solves the original problem. When a card is selected and expands, the Standard Version dropdown is populated exclusively from the versions array within that standard’s JSON row. A user working on a Full Scope card sees only Full Scope versions A user working …
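Because each standard's JSON row already carries only its own versions and levels, "filtering" the dropdowns reduces to a simple lookup, with no query logic at all. A minimal Python sketch of that idea follows; the key names mirror the description above, but the version and level values are illustrative, not the client's actual data:

```python
# Illustrative shape of the JSON field on the record: one row per
# certification standard, each pre-scoped to its own versions and levels.
scope_config = [
    {"standard": "Full Scope",      "versions": ["4.0", "4.1"], "levels": ["Bronze", "Silver", "Gold", "Platinum"]},
    {"standard": "Material Health", "versions": ["3.1", "4.0"], "levels": ["Bronze", "Silver", "Gold"]},
    {"standard": "Circularity",     "versions": ["4.0"],        "levels": ["Silver", "Gold"]},
]

def options_for(standard_name):
    """Return the dropdown options for one standard's card.

    No filtering expression is needed: the row for the selected
    standard already contains only its own versions and levels.
    """
    row = next(r for r in scope_config if r["standard"] == standard_name)
    return row["versions"], row["levels"]

versions, levels = options_for("Full Scope")
```

The web resource's JavaScript does the equivalent lookup when a card expands, which is why no subgrid workaround, form script hack, or plugin-based filtering is required.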
Business Central Environment Transfers: What Works, What Doesn’t, and Why
Subtitle: Environment movement in Microsoft Dynamics 365 Business Central is not supported across tenants or regions – migration is the only viable approach. Author: Siddhi Patekar · Sr. Functional Consultant. Siddhi specializes in helping organizations transition from manual processes to fully digital systems using Microsoft Dynamics 365. She has worked closely with pharmaceutical manufacturers, service organizations, and the banking sector to design and implement solutions that enhance compliance, improve traceability, and drive operational efficiency. Industry: Cross-industry | Technology: Microsoft Dynamics 365 Business Central | Years of experience: 5 | Certification: MB-800 Summary The Core Reality: You Don’t Transfer – You Migrate Most organizations using Microsoft Dynamics 365 Business Central eventually ask: “Can we move our environment to another tenant or region?” The answer is simple: No. This is not a limitation of configuration – it is a platform-level restriction enforced by Microsoft. Why this restriction exists: The one rule to remember: What You Cannot Do Organizations often attempt shortcuts that are simply not supported: These are not edge cases – they are hard platform constraints. What Actually Works: The Only Supported Approach The only viable method is: Recreate + Migrate A successful migration typically follows this structure: This is not a lift-and-shift – it is a controlled rebuild. What Always Breaks (Be Prepared) Every migration involves rework. The most common areas impacted: Planning for this upfront avoids delays later. Where “Transfer Environment” Actually Helps There is often confusion around this feature. Important clarification: It is useful for internal environment movement – but not for restructuring tenants. Real-World Scenario: Tenant Consolidation for Integration Situation A company was running: This resulted in: Project Goals What Should Be Done Instead A structured approach ensures success: 1.
Align Tenant Strategy Early Define a single primary tenant for all business applications. 2. Plan Data Migration Properly 3. Rebuild Integrations the Right Way 4. Re-evaluate Licensing Migration is the best time to optimize licensing before renewal cycles. Business Impact Following this approach, organizations typically achieve: Frequently Asked Questions Can Business Central environments be transferred across tenants? No. Microsoft does not support cross-tenant environment transfers. Migration is the only option. Is there any way to retain integrations during migration? No. Integrations must be reconfigured in the new tenant to ensure stability and compliance. Does Microsoft provide a direct migration tool? No single tool handles full migration. A combination of RapidStart, APIs, and manual configuration is required. Conclusion The biggest misconception in Microsoft Dynamics 365 Business Central is assuming environments can be moved. They cannot. The real decision is not whether to migrate – it is when and how well you plan it. Organizations that define their tenant strategy early avoid: Those that delay the decision often face migration under pressure – when it becomes unavoidable. Thinking about restructuring your Business Central environment or tenant strategy? Plan it early, design it right, and treat migration as a strategic initiative – not a technical task. Connect with CloudFronts to get started at transform@cloudfonts.com.
How We Built & Deployed a Mobile-Based Canvas App for Unified Time, Expense (with Receipts) & Material Submission with Project-Based Approvals for a US Cybersecurity Firm
Summary A US-based oil & gas cybersecurity firm implemented a mobile-first Canvas App integrated with Dynamics 365 Project Operations to unify time, expense, and material submission, tracking, and approval. The solution enabled project-specific approval workflows where only assigned approvers could validate submitted records. CloudFronts introduced a dual-mode interface (Day Mode and Week Mode) to improve usability for both field engineers and managers. Submission and approval cycle time reduced from hours/days to near real-time visibility. Table of Contents 1. Customer Scenario 2. Solution Overview 3. Key UX Features 4. Functional Implementation 5. Solution Walkthrough 6. Architecture & Integration Approach 7. Business Impact 8. FAQs 9. Conclusion Customer Scenario A Texas-based cybersecurity firm specializing in operational technology (OT) security for oil rigs manages multiple concurrent field projects using Dynamics 365 Project Operations. Employees and resources were responsible for logging: Time entries Expense entries (travel, accommodation, airfare, etc.) Material usage logs (equipment, parts, consumables, etc.) However, the system was not designed for mobile-first usage, and processes were fragmented across multiple interfaces. Key Challenges Field engineers and other resources could not efficiently submit entries from mobile devices Time, expense, and material tracking existed in separate workflows Approval processes had to be restricted to project-specific stakeholders Project managers lacked real-time visibility into resource usage Delays in submission caused downstream billing and reporting issues Project tracking accuracy was compromised, and reporting delays directly affected client communication and billing cycles. Solution Overview CloudFronts designed and deployed a unified mobile application using Power Apps (Canvas Apps) integrated with Dynamics 365 Project Operations.
Objective: One app → All submissions → Controlled approvals → Real-time visibility What the App Enables For Field Users: Submit time entries (daily or weekly) Create expense entries with receipt validation Log material consumption against projects Track submission status instantly For Project Approvers: View only entries related to assigned projects Approve or reject submissions directly from mobile Maintain audit-ready approval workflows Key UX Features The application is designed with a strong focus on usability for both resources and project approvers, ensuring a seamless mobile experience across submission and approval workflows. 1. Day Mode / Week Mode Toggle The app provides a flexible entry experience through a dual-mode interface: Day Mode: Enables detailed entry for a single day, ideal for precise logging and corrections. Week Mode: Allows bulk entry across multiple days, reducing effort for repetitive data entry. This flexibility significantly improves usability across different working styles and scenarios. 2. Calendar-Based Swipe Navigation The application introduces a Dynamics-style calendar navigation with swipe support, allowing users to: Traverse across multiple days or weeks effortlessly View and manage multiple submission records in sequence Navigate between historical and current entries with minimal effort This mobile-first interaction design reduces friction in high-frequency data entry scenarios. 3. Unified Submission & Approval Experience The UI/UX is intentionally designed to mirror the complete lifecycle of a record, ensuring consistency between submission and approval stages. Each record follows a structured lifecycle aligned with Dynamics 365 stages: Submitted Pending Approved Rejected Recall Requested Recall Request Approved Recall Request Rejected The interface dynamically adapts based on the current stage: Action buttons (Approve, Reject, Recall, etc.) 
are conditionally visible Status indicators are clearly displayed Users experience the same structured flow from creation to closure This ensures clarity, reduces errors, and improves user confidence in the system. 4. Dynamic Action-Based UI (Smart Button Behavior) The app intelligently modifies UI controls based on record state: Submit button appears only for draft entries Approve/Reject buttons are visible only to project approvers Recall option is available only after submission Post-approval states restrict further edits This enforces role-based and state-based control, preventing invalid actions and maintaining process integrity. 5. Conditional Receipt Upload for Expense Entries Expense submission logic is enhanced with category-driven validation: Mandatory: Airline tickets, OT hardware purchases Optional: Meals, local travel This balances compliance requirements with user convenience, avoiding unnecessary friction. 6. On-Demand Data Refresh Users can manually refresh data within the app to: Fetch the latest submission and approval statuses Sync newly created or updated records Ensure real-time visibility without relying solely on background refresh Especially useful in environments with intermittent connectivity. 7. Mobile-First Interaction Design Touch-friendly controls Swipe navigation Lightweight screens for faster performance Minimal navigation depth This ensures field engineers working in remote or on-site environments can operate efficiently. Functional Implementation This section outlines how the solution was implemented within Dynamics 365 Project Operations and the Power Platform to enable end-to-end submission and approval management. 1. 
Unified Data Model in Dataverse All three entry types — Time, Expense, and Material — are structured within Dataverse and linked to: Project Resource (User) Approval records Supporting documents (for expenses) Each submission creates a corresponding record with a defined lifecycle stage, ensuring consistency across all entry types. 2. Submission Logic from Canvas App Each submission type follows a structured flow: User selects project and entry type (Time / Expense / Material) Required fields are validated based on entry type Conditional logic enforces: Receipt requirement (for specific expense categories) Mandatory fields (based on business rules) Record is created in Dataverse Submission triggers backend approval workflow This ensures that all records entering the system are complete, validated, and ready for approval processing. 3. Approval Record Creation & Routing Upon submission: A corresponding approval record is automatically created The system identifies project-specific approvers Key behavior: Only assigned project approvers can view and act on records Approval actions update the main record status 4. Record Lifecycle Management (Status-Driven System) Lifecycle: Draft → Submitted → Pending → Approved / Rejected → Recall Flow Users submit records → moves to Submitted Approvers review → Approved or Rejected Users request recall → Recall Requested Approvers respond → Recall Approved or Rejected Controlled through: Power Apps UI logic MS Bound Actions for submission and approval handling Dataverse status fields 5. Expense Receipt Handling (Integrated from Previous Solution) Receipt upload enforced conditionally Files stored as Notes (Annotations) in Dataverse Linked to expense records This eliminates manual document handling and ensures compliance. 
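The status-driven lifecycle above can be summarized as a small transition table. The stage names in this Python sketch come straight from the lifecycle described in the implementation, but the transition map itself is an illustrative assumption, not the actual Dataverse status configuration:

```python
# Allowed transitions for the status-driven record lifecycle described above.
# Stage names follow the blog; which transitions are legal is an assumption
# sketched for illustration, not the deployed configuration.
TRANSITIONS = {
    "Draft":            {"Submitted"},
    "Submitted":        {"Pending", "Recall Requested"},
    "Pending":          {"Approved", "Rejected"},
    "Recall Requested": {"Recall Request Approved", "Recall Request Rejected"},
}

def move(current, target):
    """Apply one lifecycle transition, rejecting moves the flow does not allow."""
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"invalid transition: {current} -> {target}")
    return target

state = move("Draft", "Submitted")   # user submits an entry
state = move(state, "Pending")       # routed to the project approver
state = move(state, "Approved")      # approver signs off
```

Gating the Canvas App's Approve/Reject/Recall buttons on the same table keeps the UI and the backend in agreement about which actions are legal at each stage, which is exactly the state-based control the solution enforces.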
Solution Walkthrough The following walkthrough …
Building a Reliable Bronze Silver Gold Data Pipeline in Databricks for Enterprise Reporting
Summary Modern analytics platforms require structured data pipelines that ensure reliability, consistency, and governance across reporting systems. Traditional ETL approaches often struggle to scale as data volume and complexity increase. This blog explains how the Bronze–Silver–Gold (Medallion) architecture in Databricks provides a scalable and reliable framework for organizing data pipelines. It highlights how each layer serves a specific purpose, enabling better data quality, governance, and seamless integration with reporting tools such as Power BI. The Real Problem: Reporting Pipelines Become Fragile Over Time In many organizations: This leads to unreliable reporting and increased maintenance effort. What Is the Bronze–Silver–Gold Architecture? The Medallion architecture organizes data into three layers: Bronze Layer Raw data ingestion layer. Silver Layer Cleaned and standardized data. Gold Layer Business-ready, reporting-optimized data. Each layer has a clear responsibility. Bronze Layer: Raw Data Ingestion Purpose Key Characteristics Bronze acts as the system of record. Silver Layer: Data Standardization Purpose Key Activities Silver creates reusable datasets across reporting use cases. Gold Layer: Reporting-Ready Data Purpose Key Characteristics Gold tables are consumed directly by reporting tools. Why This Architecture Works 1. Separation of Concerns Each layer has a defined role, reducing complexity. 2. Improved Data Quality Data is progressively refined from raw to curated. 3. Better Performance Reporting queries run on optimized Gold tables. 4. Governance with Unity Catalog Access can be controlled at each layer: Common Implementation Mistakes These mistakes lead to long-term instability. Business Impact To conclude, the Bronze–Silver–Gold architecture provides a strong foundation for building scalable and reliable data pipelines in Databricks. 
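The three layer responsibilities described above can be sketched in a minimal in-memory form. In Databricks these would be Delta tables written by separate jobs; here plain Python lists stand in so the per-layer transformations stay visible, and the column names and sample rows are invented for illustration:

```python
# Bronze: raw ingestion layer - data kept as received, duplicates and all,
# acting as the system of record.
bronze = [
    {"order_id": 1, "region": "emea ", "amount": "100"},
    {"order_id": 1, "region": "emea ", "amount": "100"},   # duplicate row
    {"order_id": 2, "region": "APAC",  "amount": "250"},
]

# Silver: cleaned and standardized - deduplicate, trim, normalize case,
# and cast amounts to numeric types for reuse across use cases.
seen, silver = set(), []
for row in bronze:
    if row["order_id"] in seen:
        continue
    seen.add(row["order_id"])
    silver.append({
        "order_id": row["order_id"],
        "region": row["region"].strip().upper(),
        "amount": float(row["amount"]),
    })

# Gold: business-ready aggregate, the shape a reporting tool like
# Power BI would consume directly.
gold = {}
for row in silver:
    gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]
```

Keeping each step in its own layer is what gives the separation of concerns the architecture promises: reporting queries touch only the small, curated Gold output, never the raw Bronze feed.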
When combined with proper governance and disciplined design, it enables organizations to deliver consistent, high-quality data for analytics and decision-making. We hope you found this article useful. If you would like to explore how a well-governed Databricks data pipeline can improve your reporting and analytics, please contact us at transform@cloudfronts.com.
Stop Hard-Coding Recipients: Streamlining Email Automation with Dataverse and Power Automate for a U.S.-Based Window and Door Manufacturer
Summary A window and door manufacturing company based in the United States, specializing in energy-efficient fenestration products, eliminated brittle hard-coded email recipients from their sales automation by adopting a Dataverse-driven approach in Power Automate. CloudFronts implemented a dynamic recipient resolution pattern using coalesce and createArray expressions, pulling the To and CC parties directly from opportunity record lookups in Microsoft Dynamics 365 CRM. The solution handles null lookups gracefully, scales as team structures change, and requires zero flow edits when personnel or roles shift. Business impact: Reduced flow maintenance overhead, eliminated misdirected emails caused by stale hard-coded addresses, and established a reusable pattern applicable across multiple automation scenarios. About the Customer The customer is a U.S.-based manufacturer of custom steel windows and doors, serving commercial, residential, and architectural projects. Established in the mid-1980s, the company specializes in high-performance, energy-efficient fenestration systems designed for both modern and heritage applications. They rely on Microsoft Dynamics 365 CRM to manage their sales pipeline, opportunity tracking, and customer communications across a distributed sales network. Their sales process involves multiple stakeholders per opportunity, including an opportunity owner, primary customer contact, forwarding representative, and regional sales representative, all of whom may need to be included in outbound communications at different stages of the deal lifecycle. The Challenge When the organization first automated opportunity-related emails through Power Automate, recipient addresses were defined statically inside the flow. A specific mailbox was hard-coded as the CC address, and To recipients were manually entered per scenario. 
This approach worked initially but quickly became a source of ongoing problems: Stale recipients: When team members changed roles or left the organization, flows continued sending emails to incorrect or inactive addresses, requiring a developer to open the flow and update it manually every time. No relationship to CRM data: The recipient list in the flow had no connection to who was actually assigned to the opportunity in Dynamics 365 CRM. The two could easily fall out of sync. Scalability and maintenance burden: As the number of automated flows grew, so did the number of places where email addresses were hard-coded. A single personnel change could require updates across multiple flows, increasing both effort and the risk of missing one. Inability to handle variable stakeholders: Not every opportunity has the same set of involved parties. Some have a forwarding representative, others do not. Some have a dedicated sales representative assigned, while others rely only on the owner. A static recipient list cannot handle this variability. The organization needed a recipient model that was driven entirely by what was recorded in CRM, not by what a developer had typed into a flow months earlier. The Solution CloudFronts redesigned the email automation to resolve all recipients dynamically at runtime, using lookup field values from the opportunity record in Dataverse. No email addresses are stored in the flow itself. Technologies Used: Microsoft Dynamics 365 CRM (source of opportunity data, ownership, and stakeholder relationships); Power Automate (orchestration layer for the email automation); Dataverse connector (real-time retrieval of the opportunity record and related lookup fields); Email activity in CRM (target entity for structured email creation with party list support). What CloudFronts Configured The flow fetches the opportunity record from Dataverse as its first action after the trigger.
From that single record, four lookup fields are evaluated: the record owner (_ownerid_value), the opportunity contact (_cf_opportunitycontact_value), a forwarding sales representative (_cf_forwardingtosalesrep_value), and the primary sales representative (_cf_salesrep_value). Each lookup is conditionally included in the recipient array only if it is not null. If a lookup field has no value on a given opportunity, it is excluded entirely, the flow does not error, and no placeholder address fills the gap. The recipient array is constructed using a single coalesce + createArray expression, producing a clean party list that is passed directly into the email activity creation step. The participationtypemask value distinguishes the To recipient (mask 1, the owner via systemusers) from CC recipients (mask 2, contacts). Power Automate Flow Walkthrough The diagram above illustrates the end-to-end structure of the flow. Below is a breakdown of each stage. Step 1: Trigger The flow is triggered by a CRM event such as an opportunity stage change, a manual button, or a scheduled recurrence. Step 2: Get opportunity record A Dataverse action retrieves the full opportunity record including all lookup fields.
Step 3: Build the recipients array
This is the core of the solution:

    coalesce(
      createArray(
        if(
          not(equals(outputs('Get_Opportunity_Record')?['body/_ownerid_value'], null)),
          json(concat(
            '{"participationtypemask": 1, "partyid@odata.bind": "systemusers(',
            outputs('Get_Opportunity_Record')?['body/_ownerid_value'],
            ')"}'
          )),
          null
        ),
        if(
          not(equals(outputs('Get_Opportunity_Record')?['body/_cf_opportunitycontact_value'], null)),
          json(concat(
            '{"participationtypemask": 2, "partyid@odata.bind": "contacts(',
            outputs('Get_Opportunity_Record')?['body/_cf_opportunitycontact_value'],
            ')"}'
          )),
          null
        ),
        if(
          not(equals(outputs('Get_Opportunity_Record')?['body/_cf_forwardingtosalesrep_value'], null)),
          json(concat(
            '{"participationtypemask": 2, "partyid@odata.bind": "contacts(',
            outputs('Get_Opportunity_Record')?['body/_cf_forwardingtosalesrep_value'],
            ')"}'
          )),
          null
        ),
        if(
          not(equals(outputs('Get_Opportunity_Record')?['body/_cf_salesrep_value'], null)),
          json(concat(
            '{"participationtypemask": 2, "partyid@odata.bind": "contacts(',
            outputs('Get_Opportunity_Record')?['body/_cf_salesrep_value'],
            ')"}'
          )),
          null
        )
      )
    )

Each lookup is checked for null and included only when present, producing a clean, variable-length recipient list from CRM data.

Step 4: Null checks per lookup
Missing stakeholders are simply excluded without breaking the flow.

Step 5: Create email activity
The recipient list is passed into Dataverse email activity creation.

Step 6: Email sent
Recipients are resolved dynamically from CRM data at runtime.
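For illustration, on an opportunity where the owner and one sales representative are populated and the other lookups are empty, the Step 3 expression conceptually yields a party list like the following (the GUIDs here are made up, not from a real environment):

```json
[
  { "participationtypemask": 1, "partyid@odata.bind": "systemusers(11111111-aaaa-bbbb-cccc-222222222222)" },
  { "participationtypemask": 2, "partyid@odata.bind": "contacts(33333333-dddd-eeee-ffff-444444444444)" }
]
```

The mask-1 entry becomes the To line of the email activity and each mask-2 entry becomes a CC recipient.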
Business Impact

Metric | Before | After
Recipient source | Hard-coded in flow | Live from CRM opportunity record
Personnel change handling | Manual flow edit required | Automatic; a CRM update is sufficient
Variable stakeholder support | Not possible | Supported natively
Misdirected email risk | High | Eliminated
Flow maintenance effort | Per-change developer intervention | None for recipient changes

The organization now operates email automation where the flow itself never needs to be edited when team structures shift. Updating the opportunity record in CRM is the single source of truth, and the flow responds accordingly at runtime.

Frequently Asked Questions

What if all lookup fields are null on an opportunity?
The createArray expression will produce an array of null values, and coalesce will return an empty or minimal array. It is recommended to add a condition step before the email creation to check that at least one valid recipient exists and to handle the empty case, such as logging a CRM note or notifying an administrator, rather than …
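As the FAQ suggests, a guard before the email creation step can be sketched as a Power Automate condition expression. This assumes the party list has been stored in a variable named recipients; the name is illustrative, not from the actual flow:

```
@greater(length(variables('recipients')), 0)
```

When the expression is false, the flow can branch to a logging or notification path instead of attempting to create an email activity with no recipients.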
Stop Creating Entities: Simplifying CRM with JSON and Custom HTML for a Sustainability Certification Non-Profit in the Netherlands
Summary
A non-profit sustainability certification organization reduced CRM complexity by replacing multiple custom entities with a JSON-based data structure in Microsoft Dynamics 365 CRM. CloudFronts implemented a custom HTML interface to dynamically render input fields and manage document uploads within a single, unified UI. The approach eliminated repeated schema changes, reduced admin overhead, and enabled faster adaptation to evolving certification requirements.

Business impact: Reduced CRM customization overhead, accelerated onboarding of new certification types, and a more maintainable solution that scales without structural rework.

About the Customer
The customer is a non-profit organization focused on sustainability certification across industries. They operate across multiple certification programs, each with distinct documentation requirements, input fields, and approval workflows. Their team relies on Microsoft Dynamics 365 CRM as the central platform for managing certification applications, applicant data, and compliance records.

The Challenge
Microsoft Dynamics 365 CRM is built for structured data, but not all business processes follow fixed structures. The organization managed several certification programs, each requiring different sets of input fields, document uploads, and validation logic. Initially, each new certification type was handled by creating a new custom entity or modifying existing ones to accommodate the required fields. While this worked for a small number of programs, the approach quickly revealed significant limitations:

Schema rigidity: Every time a new certification type was introduced, or an existing one updated, the CRM schema had to be modified. This meant new fields, new relationships, and repeated deployment cycles.

Administrative overhead: Each schema change required coordination between developers and CRM administrators, creating delays and dependency bottlenecks.
Inconsistent UI experience: With different entities handling different certification types, the user interface lacked consistency. Applicants and internal users faced a fragmented experience depending on which program they were working in.

Scalability ceiling: The entity-per-program model was not designed to scale. Adding a tenth or fifteenth certification type would sharply increase the complexity of the CRM data model.

Document management friction: Handling document uploads across multiple entities was cumbersome, with no unified approach to tracking submission status or linking files to the correct certification record.

The organization needed a solution that could accommodate evolving certification structures without requiring constant schema modifications or developer intervention.

The Solution
CloudFronts redesigned the data architecture by replacing the multi-entity model with a JSON-based structure stored within Dynamics 365 CRM, paired with a custom HTML interface to dynamically render the appropriate fields and manage document workflows.

Technologies Used

Technology | Purpose
Microsoft Dynamics 365 CRM | Core platform for certification records, applicant data, and workflow management
JSON | Flexible data structure for storing dynamic certification inputs within a single CRM field
Custom HTML with JavaScript | Dynamic front-end interface rendered within the CRM form, replacing static entity-based layouts
Power Automate | Supporting workflows for notifications, approvals, and document status updates

What CloudFronts Configured
Rather than creating a separate entity for each certification type, CloudFronts introduced a single Certification Application entity with a dedicated JSON field to store all variable inputs. A configuration-driven approach was used: each certification type is defined by a schema that specifies which fields to show, what validations to apply, and which documents are required.
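To make the configuration-driven idea concrete, here is a minimal JavaScript sketch of how a schema stored in the JSON field could drive form rendering. The field names, labels, and function are hypothetical illustrations of the pattern, not the actual CloudFronts implementation:

```javascript
// Render one HTML input per configured field, honoring type, options, and required flag.
// The config object stands in for the JSON schema stored on the Certification Application record.
function renderForm(config) {
  return config.fields.map(f => {
    const req = f.required ? " required" : "";
    if (f.type === "select") {
      // Dropdowns get their choices from the configuration, not from CRM option sets.
      const opts = f.options.map(o => `<option>${o}</option>`).join("");
      return `<label>${f.label}<select name="${f.name}"${req}>${opts}</select></label>`;
    }
    return `<label>${f.label}<input type="${f.type}" name="${f.name}"${req}></label>`;
  }).join("\n");
}

// Hypothetical configuration for one certification type.
const config = {
  fields: [
    { name: "farmSize", label: "Farm size (ha)", type: "number", required: true },
    { name: "region", label: "Region", type: "select", required: false, options: ["EU", "Non-EU"] }
  ]
};

console.log(renderForm(config));
```

Adding a new certification type under this pattern means adding a new configuration object; no new entity, no new fields, and no deployment.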
The custom HTML interface reads this configuration at runtime and dynamically renders the correct form; no code changes are required when a new certification type is added or an existing one is modified. The same interface handles document uploads, linking each file to its corresponding certification record and tracking submission status in real time. CloudFronts also implemented role-based visibility within the HTML component, ensuring that internal reviewers, applicants, and administrators each see only the sections relevant to their function.

Business Impact

Metric | Before | After
Adding a new certification type | Requires schema changes and deployment | Configuration update only
UI consistency | Fragmented across entities | Unified interface for all programs
Developer dependency | High; every change needed development effort | Low; administrators manage configurations
Document tracking | Manual, per entity | Centralized and automated
CRM data model complexity | Growing with each program | Stable and maintainable

The organization can now onboard new certification programs in a fraction of the time, without touching the underlying CRM schema. Internal teams manage certification configurations independently, and the development team focuses on feature improvements rather than reactive schema maintenance.

Frequently Asked Questions

When should I use JSON instead of CRM entities?
JSON is a strong fit when input structures vary frequently, differ across record types, or are driven by business rules that change regularly. If your data model is stable and relational, entities remain the better choice.

Is it possible to query or filter on JSON data in CRM?
Direct filtering on JSON fields in Dynamics 365 is limited. CloudFronts structured the solution so that key filterable attributes, such as certification type, status, and applicant ID, remain standard CRM fields, while the variable payload lives in JSON.

Does the custom HTML approach work on mobile?
Yes.
The HTML web resource is built to be responsive and functions within the Dynamics 365 mobile app, though optimal use is on desktop given the complexity of certification forms.

Can this approach support approval workflows?
Yes. Power Automate workflows trigger on standard CRM field changes, such as status updates, and do not depend on the JSON structure, keeping workflow logic clean and maintainable.

Conclusion
Not every data problem in CRM needs a new entity. When business requirements are variable and evolving, as they often are in certification, compliance, and document-heavy workflows, a rigid entity model can become a liability rather than an asset. By combining JSON-based storage with a dynamic HTML interface, CloudFronts helped this organization build a CRM solution that adapts to change without requiring structural rework. The result is a leaner data model, a more consistent user experience, and a team that can move faster because they are no longer dependent on developer cycles for every process update. Sometimes the best CRM architecture is the one that knows when not to add more to the schema.

We hope you found this article useful. If you would like to explore how AI-powered customer service can improve your support …
From Manual to Automated: Scalable Client Statement Reporting with Power BI for a Houston-Based Enterprise Security Services Firm
Summary
A services firm based in Houston, Texas, specializing in enterprise security solutions, improved operational efficiency by transitioning from Excel-based reporting to Power BI Paginated Reports, a transition implemented by CloudFronts. CloudFronts designed a structured, client-ready reporting solution integrated with Dynamics 365 CRM. The solution supports manual distribution today while being fully prepared for future automation such as scheduled PDF delivery.

Business impact: Improved operational efficiency, standardized reporting, and scalability without rework.

Client-ready account statement using Power BI Paginated Reports

About the Customer
As a 9x Microsoft Gold Partner and 6x Microsoft Advanced Specialization-endorsed organization based in Texas, U.S., the customer specializes in delivering solutions for critical business needs across systems management, security, data insights, and mobility.

The Challenge
Initially, the organization generated account statements manually in Excel for a small number of clients. While this approach worked at a smaller scale, it presented several limitations:

Manual effort and inefficiency: Reports had to be created individually for each client.

Lack of standardization: Formatting and structure varied across reports.

Scalability concerns: While effective for a small client base, the process was not designed to scale as the business grows to 30–50+ clients.

Technology decision gap: The team required guidance on choosing between SSRS and Power BI Paginated Reports, along with future automation capabilities.

As a result, the organization needed a solution that addressed current inefficiencies while preparing for future scale.

The Solution
CloudFronts implemented Power BI Paginated Reports, integrated with Dynamics 365 CRM, to create structured, print-ready account statements.
Technologies Used

Technology | Purpose
Dynamics 365 CRM | Source of funding, account, and transaction data
Power BI Paginated Reports | Pixel-perfect, client-facing statements
Power BI Service | Hosting and future automation capabilities

What CloudFronts Configured
CloudFronts designed a paginated report tailored for client communication, including account summaries, transaction-level details, and allocation tracking. The solution includes parameterized filtering for month, account, and funding status, enabling efficient report generation across multiple clients. The report was built with a strong emphasis on consistency, print-ready formatting, and reusability, ensuring that reports can be generated without redesign as the business grows. CloudFronts also guided the customer in selecting Power BI Paginated Reports over SSRS to ensure better alignment with the Power BI ecosystem and support for future automation such as subscription-based PDF delivery.

Key Implementation Decisions

Replacing Excel with Paginated Reports: Improved standardization and reduced manual effort.

Choosing Paginated Reports over SSRS: Enabled seamless integration with Power BI Service and future automation readiness.

Designing for scalability: Built a solution that works manually today but supports automation in the future.

Business Impact

Metric | Before | After
Report creation | Manual, Excel-based | System-generated reports
Operational efficiency | Low | Significantly improved
Scalability | Limited | Ready for growth
Consistency | Variable | Standardized

The organization now operates with a structured reporting system that reduces manual effort while being fully prepared for future automation.

Frequently Asked Questions

Should I use SSRS or Power BI Paginated Reports?
If you are already using Power BI, Paginated Reports are the better choice due to seamless integration and future automation support.

Can I automate PDF report delivery later?
Yes. Paginated Reports support subscription-based delivery for automated PDF emails.
Do I need automation from day one?
No. It is more effective to design a scalable solution first and introduce automation as the business grows.

Conclusion
This implementation highlights that effective reporting is not just about automation; it is about designing for scalability from the beginning. By choosing Power BI Paginated Reports, the organization built a solution that meets current needs while avoiding future rework as they grow. Not every reporting requirement needs a dashboard or immediate automation. A well-designed structured report can often be the most scalable solution.

Read the full case study here: Invoke

We hope you found this article useful. If you would like to explore how AI-powered customer service can improve your support operations, please contact us at transform@cloudfronts.com.

Deepak Chauhan | Consultant, CloudFronts
Understanding the Difference Between Temporary Tables and SourceTableTemporary in Business Central
Summary
In Microsoft Dynamics 365 Business Central, performance and data handling are critical, especially when dealing with intermediate calculations, staging data, or processing large datasets. Developers often come across two commonly used approaches: Temporary Table variables and the SourceTableTemporary page property. At first glance, both seem to do the same thing: store data temporarily without writing to the database. But in reality, they serve different purposes and behave differently in real-world scenarios.

This blog explains:
1] What Temporary Tables are
2] What SourceTableTemporary is
3] Key differences between them
4] When to use which approach
5] Real-world development scenarios

The Real Problem: Handling Temporary Data Efficiently
Let's take a real development scenario. You are building a customization where data must be calculated, staged, or previewed without ever being written to the database.

Example Use Cases
1] Generating preview reports
2] Aggregating data before posting
3] Showing calculated insights on a page
4] Temporary staging before validation

The Challenge
If you use normal tables, you create physical records that add database load and require cleanup logic afterwards. If you misuse temporary structures, data may never appear on the UI or the logic becomes inconsistent. So the key question becomes: Should you use a Temporary Table variable or SourceTableTemporary?

What are Temporary Tables?
Temporary tables are record variables that exist only in memory and are not stored in the SQL database.

Key Characteristics

    var
        TempSalesLine: Record "Sales Line" temporary;

Behavior Example

    TempSalesLine.Init();
    TempSalesLine."Document No." := 'TEMP001';
    TempSalesLine.Insert();

This record exists only during runtime and never touches the database.

What is SourceTableTemporary?
SourceTableTemporary is a page-level property. It makes the entire page operate on a temporary version of its Source Table.

Definition

    SourceTableTemporary = true;

Behavior Example

    trigger OnOpenPage()
    begin
        Rec.Init();
        Rec."No." := 'TEMP001';
        Rec.Insert();
    end;

Here, Rec is temporary because the page is set to SourceTableTemporary = true.
Key Differences

Aspect | Temporary Table | SourceTableTemporary
Scope | Variable-level | Page-level
Usage | Backend logic | UI pages
Data Lifetime | Until the variable is cleared | Until the page is closed
Control | Full AL control | Page-driven UI
Binding | Not directly bound to UI | Directly bound to UI
Use Case | Processing, calculations | Displaying temporary data

Practical Scenarios

Scenario 1: Data Processing Logic
You are calculating totals before posting a document. Use Temporary Tables. Why? The logic runs entirely in the backend and needs no UI binding.

Scenario 2: Showing Preview Data on a Page
You want to show calculated or staged data to the user before anything is committed. Use SourceTableTemporary. Why? The page binds the temporary data directly to the UI.

Scenario 3: Hybrid Use Case
Sometimes you process data in a temporary record variable and then need to display the result. Best Practice: build the data in a Temporary Table, then copy it into the Rec of a page with SourceTableTemporary enabled.

Why Choosing the Right Approach Matters
Using the wrong approach can lead to:

Problem | Cause
Data not visible on UI | Using only temporary variables
Performance issues | Writing unnecessary records
Complex cleanup logic | Using physical tables instead of temporary
UI inconsistency | Misusing SourceTableTemporary

Business Impact

1. Improved Performance
Temporary data handling reduces database load and improves execution speed.

2. Cleaner Data Architecture
No unnecessary records stored → no cleanup jobs required.

3. Better User Experience
Users can preview and interact with data without affecting actual records.

4. Safer Development Practices
Avoids accidental data writes and improves system stability.

5. Flexible Customizations
Developers can build simulation, preview, and staging features easily.

6. Reduced Maintenance Effort
No need for background jobs to delete temporary records.

Final Thoughts
Both Temporary Tables and SourceTableTemporary are powerful tools, but they are not interchangeable. Think of it like this: a Temporary Table is an in-memory scratchpad for your code, while SourceTableTemporary turns a whole page into a scratchpad for the user. Choosing the right one depends on where your logic lives: backend processing points to Temporary Tables, while displaying temporary data points to SourceTableTemporary.

I hope you found this blog useful! "Discover How We've Enabled Businesses Like Yours – Explore Our Client Testimonials!" Please feel free to connect with us at transform@cloudfronts.com
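The hybrid pattern from Scenario 3 can be sketched in AL as follows. The object ID, page name, and field values are illustrative, not from a real extension:

```al
page 50100 "Posting Preview"
{
    PageType = List;
    SourceTable = "Sales Line";
    SourceTableTemporary = true; // the page operates on an in-memory copy of Sales Line

    layout
    {
        area(Content)
        {
            repeater(Lines)
            {
                field("Document No."; Rec."Document No.") { ApplicationArea = All; }
                field("Line No."; Rec."Line No.") { ApplicationArea = All; }
            }
        }
    }

    trigger OnOpenPage()
    var
        TempSalesLine: Record "Sales Line" temporary;
    begin
        // 1) Build or aggregate the data in a temporary record variable (backend logic).
        TempSalesLine.Init();
        TempSalesLine."Document No." := 'PREVIEW';
        TempSalesLine."Line No." := 10000;
        TempSalesLine.Insert();

        // 2) Copy the processed rows into the page's temporary Rec so the UI can show them.
        if TempSalesLine.FindSet() then
            repeat
                Rec := TempSalesLine;
                Rec.Insert();
            until TempSalesLine.Next() = 0;
    end;
}
```

Nothing in this flow touches the SQL database: the variable holds the working set, and the SourceTableTemporary page exposes it to the user.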
Optimizing Power BI Dataset Performance Using Incremental Refresh for Large-Scale Analytics.
Summary

Use Case / Why This Matters

Prerequisites
Before implementing incremental refresh in Microsoft Power BI, ensure the following:

Step-by-Step Implementation

Step 1: Create Parameters (RangeStart & RangeEnd)
This step defines the data boundaries for incremental refresh. These parameters will control which data gets refreshed.

Step 2: Apply Filter in Power Query
This step filters the dataset using the parameters. Select your date column and apply the filter: DateColumn >= RangeStart AND DateColumn < RangeEnd. This ensures only relevant data is processed.

Step 3: Enable Query Folding
This step ensures filtering happens at the data source level. Right-click the last applied step → View Native Query. If the option is available, query folding is enabled. Query folding is critical for performance optimization.

Step 4: Configure Incremental Refresh Policy
This step defines how much data to store and refresh. This creates partitions in the dataset.

Step 5: Publish to Power BI Service
This step activates incremental refresh in the cloud. After publishing, Power BI automatically manages partitions.

Business Impact
Following the implementation, organizations achieved the following results:

Metric | Before | After
Dataset refresh time | 2–3 hours (full refresh) | 30–45 minutes
Data processing load | Entire dataset processed | Only recent data processed
Report performance | Slow with large datasets | Faster load & interaction
System resource usage | High | Optimized and controlled

Incremental refresh significantly improves scalability and ensures consistent performance for enterprise reporting.

To conclude, incremental refresh in Microsoft Power BI transforms how organizations handle large datasets by reducing refresh times and improving performance. By implementing proper data filtering, query folding, and refresh policies, businesses can scale their analytics without compromising speed.
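The Step 2 filter can be sketched in Power Query M as shown below. The server, database, table, and column names are hypothetical; RangeStart and RangeEnd are the datetime parameters created in Step 1:

```m
let
    // Hypothetical SQL source
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    Sales = Source{[Schema = "dbo", Item = "SalesTransactions"]}[Data],
    // Use >= on one bound and < on the other so rows on partition
    // boundaries are neither duplicated nor dropped
    Filtered = Table.SelectRows(Sales, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered
```

Because Table.SelectRows over a SQL source folds to a native WHERE clause, the filter runs at the data source (Step 3), which is what makes the partitioned refresh fast.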
As data volumes continue to grow, adopting incremental refresh is no longer optional—it is essential for efficient and cost-effective reporting. If your Power BI reports are slowing down due to large datasets, start implementing Incremental Refresh today. Begin by identifying your date columns, defining parameters, and configuring refresh policies. A small change can lead to massive performance improvements in your reporting environment. We hope you found this blog useful. If you would like to learn more or discuss similar solutions, feel free to reach out to us at transform@cloudfronts.com.